6 research outputs found

    ISIPTA'07: Proceedings of the Fifth International Symposium on Imprecise Probability: Theories and Applications

    Necessary and Sufficient Explanations for Argumentation-Based Conclusions

    In this paper, we discuss necessary and sufficient explanations – the question of whether and why a certain argument or claim can be accepted (or not) – for abstract and structured argumentation. Given a framework with which explanations for argumentation-based conclusions can be derived, we study necessity and sufficiency: which (sets of) arguments are necessary or sufficient for the (non-)acceptance of an argument or claim? We show that necessary and sufficient explanations can be strictly smaller than minimal explanations while still providing all the reasons for a conclusion, and we discuss their usefulness in a real-life application.
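
    The abstract above omits the paper's formal definitions, but the intuition can be illustrated with a small sketch. The Python toy example below assumes grounded semantics over an abstract argumentation framework and uses simplified, illustrative notions of "sufficient" (a set of other arguments that, together with the target and its direct attackers, still lets the target be accepted) and "necessary" (an argument whose removal makes the target unaccepted). The framework, function names, and these definitions are assumptions for illustration; the paper's formalisation differs in detail.

    from itertools import combinations

    def grounded_extension(args, attacks):
        # Fixpoint: repeatedly add arguments all of whose attackers are
        # attacked by the extension built so far.
        ext = set()
        changed = True
        while changed:
            changed = False
            for a in args:
                if a in ext:
                    continue
                attackers = {b for (b, c) in attacks if c == a}
                if all(any((d, b) in attacks for d in ext) for b in attackers):
                    ext.add(a)
                    changed = True
        return ext

    def accepted(target, args, attacks):
        # Acceptance of `target` in the framework restricted to `args`.
        sub = {(x, y) for (x, y) in attacks if x in args and y in args}
        return target in grounded_extension(args, sub)

    def sufficient_sets(target, args, attacks):
        # Toy "sufficient" sets: S such that the target is accepted in the
        # framework restricted to S, the target, and its direct attackers.
        attackers = {b for (b, c) in attacks if c == target}
        others = [a for a in args if a != target and a not in attackers]
        found = []
        for r in range(len(others) + 1):
            for combo in combinations(others, r):
                if accepted(target, set(combo) | {target} | attackers, attacks):
                    found.append(set(combo))
        return found

    def necessary_arguments(target, args, attacks):
        # Toy "necessary" arguments: removing them makes the target unaccepted.
        return {a for a in args
                if a != target and not accepted(target, set(args) - {a}, attacks)}

    # Toy framework: b attacks a, c attacks b, so c reinstates a.
    args = {"a", "b", "c"}
    attacks = {("b", "a"), ("c", "b")}
    print(sufficient_sets("a", args, attacks))      # [{'c'}]
    print(necessary_arguments("a", args, attacks))  # {'c'}

    In this toy framework the single argument c is both sufficient and necessary for accepting a, giving a rough feel for how such explanations can be much smaller than the framework as a whole.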

    Persuasive contrastive explanations for Bayesian networks

    Explanation in Artificial Intelligence is often focused on providing reasons for why a model under consideration and its outcome are correct. Recently, research in explainable machine learning has initiated a shift in focus toward including so-called counterfactual explanations. In this paper we propose to combine both types of explanation in the context of explaining Bayesian networks. To this end, we introduce persuasive contrastive explanations that aim to answer the question "Why outcome t instead of t′?" posed by a user. In addition, we propose an algorithm for computing persuasive contrastive explanations. Both our definition of persuasive contrastive explanation and the proposed algorithm can be employed beyond the current scope of Bayesian networks.
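
    As a rough illustration of the contrastive part of such an explanation, the sketch below searches a tiny hand-coded Bayesian network for a smallest set of evidence variables whose values, when flipped, change the most probable outcome from t to the foil t′. The network, the evidence, and this brute-force search are assumptions for illustration only; the paper's persuasive contrastive explanations also carry a persuasive component and are defined more carefully, and its algorithm is not the one shown here.

    from itertools import combinations, product

    # Tiny hand-coded network over binary variables: Flu -> Fever, Flu -> Cough.
    P_flu = {True: 0.1, False: 0.9}
    P_fever = {True: {True: 0.8, False: 0.2}, False: {True: 0.1, False: 0.9}}  # P(Fever | Flu)
    P_cough = {True: {True: 0.7, False: 0.3}, False: {True: 0.2, False: 0.8}}  # P(Cough | Flu)

    def joint(flu, fever, cough):
        return P_flu[flu] * P_fever[flu][fever] * P_cough[flu][cough]

    def posterior_flu(evidence):
        # P(Flu | evidence) by brute-force enumeration over the joint.
        num = {True: 0.0, False: 0.0}
        for flu, fever, cough in product([True, False], repeat=3):
            values = {"Fever": fever, "Cough": cough}
            if all(values[var] == val for var, val in evidence.items()):
                num[flu] += joint(flu, fever, cough)
        total = num[True] + num[False]
        return {k: v / total for k, v in num.items()}

    def map_outcome(evidence):
        post = posterior_flu(evidence)
        return max(post, key=post.get)

    def contrastive_explanation(evidence, target, foil):
        # Smallest set of evidence variables whose values, when flipped,
        # change the most probable outcome from `target` to `foil`.
        assert map_outcome(evidence) == target
        for r in range(1, len(evidence) + 1):
            for subset in combinations(evidence, r):
                flipped = dict(evidence)
                for var in subset:
                    flipped[var] = not flipped[var]
                if map_outcome(flipped) == foil:
                    return set(subset)
        return None

    evidence = {"Fever": True, "Cough": True}
    print(map_outcome(evidence))                           # True  (Flu is most probable)
    print(contrastive_explanation(evidence, True, False))  # {'Fever'}

    In this toy network, flipping the Fever finding alone already changes the most probable value of Flu, which is the kind of contrastive information such explanations package together with the reasons for the original outcome.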
